Distance Metric Optimization-Driven Neural Network Learning Framework for Pattern Classification

Authors

Abstract

As a novel neural network learning framework, the twin extreme learning machine (TELM) has received extensive attention and research in the field of machine learning. However, TELM is affected by noise or outliers in practical applications, so its generalization performance is reduced compared to robust algorithms. In this paper, we propose two distance metric optimization-driven twin extreme learning machine frameworks for pattern classification, namely CWTELM and FCWTELM. By introducing the Welsch loss function and the capped L2,p-distance metric, our methods reduce the effect of noise and outliers and improve the generalization performance of TELM. In addition, efficient iterative algorithms are designed to solve the challenges brought by the non-convex optimization problems of CWTELM and FCWTELM, and their convergence, local optimality, and computational complexity are theoretically guaranteed. The proposed methods are then compared with five other classical algorithms on different datasets, and statistical detection analysis is implemented. Finally, we conclude that our algorithms achieve excellent robustness and classification performance.
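As a quick illustration of the two robustness ingredients named above, the sketch below shows standard textbook forms of the Welsch loss and a capped L2,p-distance: the Welsch loss saturates for large residuals and the cap truncates large distances, which is what limits the influence of outliers. The function names, the scale parameter sigma, the exponent p, and the cap eps are illustrative assumptions; the exact way CWTELM and FCWTELM embed these terms is not given in this abstract.

```python
import numpy as np

def welsch_loss(r, sigma=1.0):
    """Welsch loss: behaves like r^2/2 for small residuals but saturates at
    sigma^2/2, so outlier residuals contribute only a bounded penalty.
    sigma is an illustrative default, not a value from the paper."""
    return (sigma ** 2 / 2.0) * (1.0 - np.exp(-(r / sigma) ** 2))

def capped_l2p_distance(x, y, p=1.0, eps=1.0):
    """Capped L2,p distance: ||x - y||_2^p truncated at eps, so a single
    far-away point cannot dominate the objective. p and eps are placeholders."""
    return min(np.linalg.norm(x - y) ** p, eps)

# Toy comparison on one clean residual and one outlier residual
residuals = np.array([0.2, 5.0])
print(0.5 * residuals ** 2)     # squared loss: [0.02, 12.5] -- outlier dominates
print(welsch_loss(residuals))   # Welsch loss:  [~0.02, ~0.5] -- outlier is bounded
```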


Similar articles

Convex Optimizations for Distance Metric Learning and Pattern Classification

The goal of machine learning is to build automated systems that can classify and recognize complex patterns in data. Not surprisingly, the representation of the data plays an important role in determining what types of patterns can be automatically discovered. Many algorithms for machine learning assume that the data are represented as elements in a metric space. For example, in popular algorit...

Constructive Neural Network Learning Algorithms for Multi-Category Pattern Classification

Constructive learning algorithms offer an approach for the incremental construction of potentially near-minimal neural network architectures for pattern classification tasks. Such algorithms help overcome the need for ad hoc and often inappropriate choices of network topology in algorithms that search for a suitable weight setting in an otherwise a priori fixed network architecture. Severa...

Constructive neural-network learning algorithms for pattern classification

Constructive learning algorithms offer an attractive approach for the incremental construction of near-minimal neural-network architectures for pattern classification. They help overcome the need for ad hoc and often inappropriate choices of network topology in algorithms that search for suitable weights in a priori fixed network architectures. Several such algorithms are proposed in the litera...

Distance Metric Learning with Eigenvalue Optimization

The main theme of this paper is to develop a novel eigenvalue optimization framework for learning a Mahalanobis metric. Within this context, we introduce a novel metric learning approach called DML-eig which is shown to be equivalent to a well-known eigenvalue optimization problem called minimizing the maximal eigenvalue of a symmetric matrix (Overton, 1988; Lewis and Overton, 1996). Moreover, ...
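For reference, the "well-known eigenvalue optimization problem" cited here (Overton, 1988) is typically written as minimizing the largest eigenvalue of an affine family of symmetric matrices. The line below states only that generic form; DML-eig's specific constraint set is not reproduced in this excerpt.

```latex
% Generic maximal-eigenvalue minimization (Overton, 1988):
% A_0, ..., A_m are fixed symmetric n-by-n matrices and x is the decision variable.
\min_{x \in \mathbb{R}^m} \ \lambda_{\max}\!\left( A_0 + \sum_{k=1}^{m} x_k A_k \right)
```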

Distance Metric Learning Through Convex Optimization

We present a survey of recent work on the problem of learning a distance metric in the framework of semidefinite programming (SDP). Along with a brief theoretical background on convex optimization and distance metrics, we present various methods developed in this context under different approaches and provide theoretical analysis for a subset of them. A gradient ascent projection algorithm (Xin...
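To make the projection idea concrete: SDP-based Mahalanobis metric learning constrains the metric matrix M to be positive semidefinite, and projected-gradient methods of the kind surveyed here enforce this by projecting M back onto the PSD cone after each gradient step (clipping negative eigenvalues). The sketch below shows only that standard projection and the induced distance; the objectives, step sizes, and constraint sets of the surveyed algorithms are not reproduced here.

```python
import numpy as np

def project_psd(M):
    """Project a symmetric matrix onto the PSD cone by zeroing out its
    negative eigenvalues (the standard Euclidean projection)."""
    M = (M + M.T) / 2.0                       # guard against numerical asymmetry
    eigvals, eigvecs = np.linalg.eigh(M)
    return eigvecs @ np.diag(np.clip(eigvals, 0.0, None)) @ eigvecs.T

def mahalanobis_sq(x, y, M):
    """Squared Mahalanobis distance (x - y)^T M (x - y) for a PSD matrix M."""
    d = np.asarray(x) - np.asarray(y)
    return float(d @ M @ d)
```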

Journal

Journal title: Axioms

Year: 2023

ISSN: 2075-1680

DOI: https://doi.org/10.3390/axioms12080765